Wriggers, Peter; Bischoff, Manfred; Oñate, Eugenio; Düster, Alexander; Zohdi, Tarek (Ed.) This study shows that the particle surface-area-to-volume ratio (A/V) and the particle volume (V) carry the key information about particle geometry, and that this 'signature' is expressed by a power-law relationship between A/V and V of the form V = β(A/V)^α. We find that the exponent α is governed by the shape-size relationship, while the β* term (β evaluated with α fixed at -3) characterises the average particle shape of a granular material in terms of overall angularity. The study also discusses how particle shape can be recovered in terms of Wadell's true sphericity from A/V and V. This concept is linked to another shape index, M, which interprets particle shape as a function of surface area A, volume V, and size L. The paper explains the analytical basis of this geometric 'signature' and examines the idea on example particles to address DEM modelling-related questions.
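
The power-law fit described in this abstract is straightforward to reproduce. The sketch below is not from the paper: the function names, the synthetic sphere data, and the choice of a log-log linear regression are my assumptions. It fits V = β(A/V)^α, evaluates the β* term with α fixed at -3, and computes Wadell's true sphericity from A and V.

```python
import numpy as np

def fit_power_law(A, V):
    """Fit V = beta * (A/V)**alpha by linear regression in log-log space.

    A and V are arrays of particle surface areas and volumes in consistent
    units (e.g. mm^2 and mm^3). Returns (alpha, beta)."""
    x = np.log(A / V)
    y = np.log(V)
    alpha, log_beta = np.polyfit(x, y, 1)   # slope = alpha, intercept = ln(beta)
    return alpha, np.exp(log_beta)

def beta_star(A, V, alpha_fixed=-3.0):
    """beta*: the beta term with alpha fixed at -3, i.e. the log-space
    intercept refit with the slope held constant (a geometric mean of
    V / (A/V)**alpha_fixed over all particles)."""
    return np.exp(np.mean(np.log(V) - alpha_fixed * np.log(A / V)))

def wadell_true_sphericity(A, V):
    """Wadell's true sphericity: surface area of the equal-volume sphere
    divided by the particle's actual surface area."""
    return (np.pi ** (1.0 / 3.0)) * (6.0 * V) ** (2.0 / 3.0) / A

# Synthetic check with spheres (radii are an assumption for illustration):
r = np.linspace(0.5, 5.0, 50)
A, V = 4.0 * np.pi * r**2, (4.0 / 3.0) * np.pi * r**3
alpha, beta = fit_power_law(A, V)
print(alpha, beta)                   # alpha ~ -3 for geometrically similar particles
print(beta_star(A, V))               # shape indicator; 36*pi for perfect spheres
print(wadell_true_sphericity(A, V))  # ~ 1 for spheres
```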
Ranzato, M.; Beygelzimer, A.; Dauphin, Y.; Liang, P. S.; Wortman Vaughan, J. (Ed.) Bootstrapping has been a primary tool for ensembling and uncertainty quantification in machine learning and statistics. However, because it relies on repeated resampling and retraining, bootstrapping deep neural networks is computationally burdensome, which makes it difficult to apply in practice to uncertainty estimation and related tasks. To overcome this computational bottleneck, we propose a novel approach called Neural Bootstrapper (NeuBoots), which learns to generate bootstrapped neural networks through a single model training. NeuBoots injects bootstrap weights into the high-level feature layers of the backbone network and outputs bootstrapped predictions of the target, without additional parameters or repeated training from scratch. We apply NeuBoots to various machine learning tasks related to uncertainty quantification, including prediction calibration in image classification and semantic segmentation, active learning, and detection of out-of-distribution samples. Our empirical results show that NeuBoots outperforms other bagging-based methods at a much lower computational cost without losing the validity of bootstrapping.
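
A minimal sketch of the underlying idea, assuming a standard PyTorch setup: the class names, the multiplicative weight embedding, and the toy backbone are my simplifications, not the authors' released implementation. Per-sample bootstrap weights are drawn, injected into the high-level features, and used to reweight the loss, so each weight draw from the single trained model emulates one bootstrap replicate.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class BootstrapHead(nn.Module):
    """Classifier head conditioned on per-sample bootstrap weights.

    The backbone's high-level features are modulated by a learned embedding
    of the bootstrap weight, so different weight draws yield different
    ("bootstrapped") predictions from the same trained model."""

    def __init__(self, feat_dim: int, num_classes: int):
        super().__init__()
        self.weight_embed = nn.Linear(1, feat_dim)   # embed the scalar bootstrap weight
        self.classifier = nn.Linear(feat_dim, num_classes)

    def forward(self, features: torch.Tensor, boot_w: torch.Tensor) -> torch.Tensor:
        # features: (batch, feat_dim), boot_w: (batch,)
        modulated = features * self.weight_embed(boot_w.unsqueeze(1))
        return self.classifier(modulated)

def sample_bootstrap_weights(batch_size: int) -> torch.Tensor:
    """Multinomial bootstrap weights: how often each sample appears in a
    resample of the same size (the counts sum to batch_size, averaging 1)."""
    idx = torch.multinomial(
        torch.full((batch_size,), 1.0 / batch_size), batch_size, replacement=True
    )
    return torch.bincount(idx, minlength=batch_size).float()

# Training-step sketch: reweight the per-sample loss by the same bootstrap weights.
backbone = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 128), nn.ReLU())  # toy backbone
head = BootstrapHead(feat_dim=128, num_classes=10)
x, y = torch.randn(32, 1, 28, 28), torch.randint(0, 10, (32,))
w = sample_bootstrap_weights(32)
logits = head(backbone(x), w)
loss = (w * F.cross_entropy(logits, y, reduction="none")).mean()
loss.backward()

# At test time, drawing several weight vectors yields an ensemble of predictions
# from the single model, usable for calibration, active learning, or OOD scoring.
```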